Regularizing Graph Neural Networks via Consistency-Diversity Graph Augmentations


Abstract

Despite the remarkable performance of graph neural networks (GNNs) in semi-supervised learning, they are criticized for not making full use of unlabeled data and for suffering from over-fitting. Recently, graph data augmentation, used to improve both the accuracy and the generalization of GNNs, has received considerable attention. However, one fundamental question remains: how can the quality of graph augmentations be evaluated in principle? In this paper, we propose two metrics, Consistency and Diversity, covering the aspects of augmentation correctness and generalization. Moreover, we discover that existing augmentations fall into a dilemma between these two metrics. Can we find graph augmentations satisfying both consistency and diversity? A well-informed answer can help us understand the mechanism behind graph augmentation and improve the performance of GNNs. To tackle this challenge, we analyze two representative semi-supervised learning algorithms: label propagation (LP) and consistency regularization (CR). We find that LP utilizes the prior knowledge of graphs to improve consistency, while CR adopts variable augmentations to promote diversity. Based on this discovery, we treat neighbors as augmentations to capture the prior knowledge embodying the homophily assumption, which promises highly consistent augmentations. To further promote diversity, we randomly replace the immediate neighbors of each node with its remote neighbors. After that, a neighbor-constrained regularization is proposed to enforce the predictions of the augmented neighbors to be consistent with each other. Extensive experiments on five real-world graphs validate the superiority of our method in improving the accuracy and generalization of GNNs.
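The augmentation the abstract describes (randomly swapping a node's immediate neighbors for remote ones, then enforcing consistent predictions across the augmented views) can be sketched as follows. This is a minimal illustration under assumed details, not the authors' implementation: the `k_hop_neighbors` and `augment` helpers, the 2-hop choice of "remote", and the placeholder prediction vectors are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph as an adjacency list of immediate (1-hop) neighbors.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}

def k_hop_neighbors(adj, node, k):
    """Nodes exactly k hops away from `node` (hypothetical 'remote neighbor' helper)."""
    frontier, visited = {node}, {node}
    for _ in range(k):
        frontier = {m for n in frontier for m in adj[n]} - visited
        visited |= frontier
    return sorted(frontier)

def augment(adj, node, p=0.5):
    """Randomly replace each immediate neighbor with a 2-hop (remote) one, prob. p."""
    remote = k_hop_neighbors(adj, node, 2)
    view = []
    for n in adj[node]:
        if remote and rng.random() < p:
            view.append(int(rng.choice(remote)))
        else:
            view.append(n)
    return view

def consistency_loss(p1, p2):
    """Mean squared difference between predictions on two augmented views."""
    return float(np.mean((p1 - p2) ** 2))

# Two augmented neighbor views of node 3.
view_a = augment(adj, 3)
view_b = augment(adj, 3)

# In a real GNN the class probabilities would come from message passing over
# each view; here they are placeholder vectors to show the loss computation.
pred_a = np.array([0.7, 0.3])
pred_b = np.array([0.6, 0.4])
print(consistency_loss(pred_a, pred_b))
```

The neighbor-constrained regularization would add this consistency term to the supervised loss, pushing predictions over differently augmented neighborhoods toward agreement.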


Related articles

Regularizing graph centrality computations

Centrality metrics such as betweenness and closeness have been used to identify important nodes in a network. However, it takes days to months on a high-end workstation to compute the centrality of today’s networks. The main reasons are the size and the irregular structure of these networks. While today’s computing units excel at processing dense and regular data, their performance is questiona...


Graph Convolutional Neural Networks via Scattering

We generalize the scattering transform to graphs and consequently construct a convolutional neural network on graphs. We show that under certain conditions, any feature generated by such a network is approximately invariant to permutations and stable to graph manipulations. Numerical results demonstrate competitive performance on relevant datasets.


Regularizing Neural Networks via Retaining Confident Connections

Regularization of neural networks can alleviate overfitting in the training phase. Current regularization methods, such as Dropout and DropConnect, randomly drop neural nodes or connections based on a uniform prior. Such a data-independent strategy does not take into consideration the quality of individual units or connections. In this paper, we aim to develop a data-dependent approach to regu...


Improved Graph Laplacian via Geometric Self-Consistency

We address the problem of setting the kernel bandwidth used by Manifold Learning algorithms to construct the graph Laplacian. Exploiting the connection between manifold geometry, represented by the Riemannian metric, and the Laplace-Beltrami operator, we set the bandwidth by optimizing the Laplacian’s ability to preserve the geometry of the data. Experiments show that this principled approach is effective an...


Convolutional Neural Networks Via Node-Varying Graph Filters

Convolutional neural networks (CNNs) are being applied to an increasing number of problems and fields due to their superior performance in classification and regression tasks. Since two of the key operations that CNNs implement are convolution and pooling, this type of network is implicitly designed to act on data described by regular structures such as images. Motivated by the recent interest...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2022

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v36i4.20307